We present a novel approach for the reconstruction of dynamic geometric shapes using a single hand-held consumer-grade RGB-D sensor at real-time rates. Our method does not require a pre-defined shape template to start with and builds up the scene model from scratch during the scanning process. Geometry and motion are parameterized in a unified manner by a volumetric representation that encodes a distance field of the surface geometry as well as the non-rigid space deformation. Motion tracking is based on a set of extracted sparse color features in combination with a dense depth-based constraint formulation. This enables accurate tracking and drastically reduces drift inherent to standard model-to-depth alignment. We cast finding the optimal deformation of space as a non-linear regularized variational optimization problem by enforcing local smoothness and proximity to the input constraints. The problem is tackled in real-time at the camera's capture rate using a data-parallel flip-flop optimization strategy. Our results demonstrate robust tracking even for fast motion and scenes that lack geometric features.
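The regularized variational problem described above, combining dense depth constraints, sparse color-feature constraints, and a local smoothness prior, can be sketched as a weighted energy; the notation below (deformation field $\mathcal{D}$, energy terms, and weights $w_{\mathrm{sparse}}, w_{\mathrm{reg}}$) is illustrative and not taken from the source:

```latex
E(\mathcal{D}) \;=\; E_{\mathrm{dense}}(\mathcal{D})
\;+\; w_{\mathrm{sparse}}\, E_{\mathrm{sparse}}(\mathcal{D})
\;+\; w_{\mathrm{reg}}\, E_{\mathrm{reg}}(\mathcal{D}),
```

where, in this sketch, $E_{\mathrm{dense}}$ measures model-to-depth alignment of the deformed surface, $E_{\mathrm{sparse}}$ penalizes deviation from the extracted color-feature correspondences, and $E_{\mathrm{reg}}$ enforces local smoothness of the space deformation; the minimizer is found by the data-parallel optimization mentioned in the text.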